Sharp thresholds for high-dimensional and noisy sparsity recovery using l1-constrained quadratic programming (Lasso)

Author

  • Martin J. Wainwright
Abstract

The problem of consistently estimating the sparsity pattern of a vector β* ∈ R^p based on observations contaminated by noise arises in various contexts, including signal denoising, sparse approximation, compressed sensing, and model selection. We analyze the behavior of l1-constrained quadratic programming (QP), also referred to as the Lasso, for recovering the sparsity pattern. Our main result establishes precise conditions on the problem dimension p, the number k of non-zero elements in β*, and the number of observations n that are necessary and sufficient for sparsity pattern recovery using the Lasso. We first analyze the case of observations made using deterministic design matrices and sub-Gaussian additive noise, and provide sufficient conditions for support recovery and l∞-error bounds, as well as results showing the necessity of incoherence and bounds on the minimum value. We then turn to the case of random designs, in which each row of the design is drawn from an N(0, Σ) ensemble. For a broad class of Gaussian ensembles satisfying mutual incoherence conditions, we compute explicit values of thresholds 0 < θ_l(Σ) ≤ θ_u(Σ) < +∞ with the following properties: for any δ > 0, if n > 2(θ_u + δ) k log(p − k), then the Lasso succeeds in recovering the sparsity pattern with probability converging to one for large problems, whereas for n < 2(θ_l − δ) k log(p − k), the probability of successful recovery converges to zero. For the special case of the uniform Gaussian ensemble (Σ = I_{p×p}), we show that θ_l = θ_u = 1, so that the precise threshold n = 2k log(p − k) is exactly determined.
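To make the stated scaling concrete, the following is a minimal simulation sketch (not part of the paper): it draws a standard Gaussian design (Σ = I_{p×p}), generates a k-sparse β*, fits the Lasso at sample size n = 2θ k log(p − k), and checks exact support recovery. The specific parameter values, the regularization choice λ ≈ σ sqrt(2 log p / n), and the 1e-6 coefficient threshold are illustrative assumptions rather than the paper's prescription.

```python
# Illustrative sketch of the n = 2*theta*k*log(p - k) support-recovery scaling
# for the standard Gaussian ensemble (Sigma = I). Parameter choices are assumptions.
import numpy as np
from sklearn.linear_model import Lasso

def support_recovery_trial(p=512, k=8, theta=1.5, sigma=0.5, beta_min=0.5, seed=0):
    rng = np.random.default_rng(seed)
    n = int(np.ceil(2 * theta * k * np.log(p - k)))   # sample size at the 2*theta*k*log(p-k) scaling
    X = rng.standard_normal((n, p))                   # rows drawn i.i.d. from N(0, I_p)
    support = rng.choice(p, size=k, replace=False)
    beta_star = np.zeros(p)
    beta_star[support] = beta_min * rng.choice([-1.0, 1.0], size=k)
    y = X @ beta_star + sigma * rng.standard_normal(n)

    lam = sigma * np.sqrt(2 * np.log(p) / n)          # assumed regularization scaling
    # sklearn's Lasso minimizes (1/(2n))*||y - X b||_2^2 + alpha*||b||_1
    fit = Lasso(alpha=lam, fit_intercept=False, max_iter=50000).fit(X, y)
    recovered = set(np.flatnonzero(np.abs(fit.coef_) > 1e-6))
    return recovered == set(support)

if __name__ == "__main__":
    for theta in (0.5, 1.0, 1.5, 2.0):
        successes = sum(support_recovery_trial(theta=theta, seed=s) for s in range(20))
        print(f"theta = {theta:.1f}: {successes}/20 exact support recoveries")
```

Under this sketch, the fraction of exact recoveries should rise sharply as θ crosses the threshold (θ_l = θ_u = 1 for Σ = I_{p×p}), though finite-size effects blur the transition for moderate p.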


Similar articles

Sharp thresholds for high-dimensional and noisy recovery of sparsity using l1-constrained quadratic programming

The problem of consistently estimating the sparsity pattern of a vector β∗ ∈ R^p based on observations contaminated by noise arises in various contexts, including signal denoising, sparse approximation, compressed sensing, and model selection. We analyze the behavior of l1-constrained quadratic programming (QP), also referred to as the Lasso, for recovering the sparsity pattern. Our main result i...


Sharp thresholds for high-dimensional and noisy recovery of sparsity

The problem of consistently estimating the sparsity pattern of a vector β∗ ∈ R^p based on observations contaminated by noise arises in various contexts, including subset selection in regression, structure estimation in graphical models, sparse approximation, and signal denoising. We analyze the behavior of l1-constrained quadratic programming (QP), also referred to as the Lasso, for recovering th...


Sharp Thresholds for High-Dimensional and Noisy Sparsity Recovery Using l1-Constrained Quadratic Programming (Lasso)

The problem of consistently estimating the sparsity pattern of a vector β∗ ∈ R^p based on observations contaminated by noise arises in various contexts, including signal denoising, sparse approximation, compressed sensing, and model selection. We analyze the behavior of l1-constrained quadratic programming (QP), also referred to as the Lasso, for recovering the sparsity pattern. Our main result is to esta...


A Sharp Sufficient Condition for Sparsity Pattern Recovery

The number of noisy linear measurements sufficient for exact and approximate sparsity pattern/support-set recovery in the high-dimensional setting is derived. Although this problem has been addressed in the recent literature, there are still considerable gaps between those results and the exact limits of perfect support-set recovery. To reduce this gap, in this paper, the sufficient con...


Sharp Support Recovery from Noisy Random Measurements by L1 minimization

In this paper, we investigate the theoretical guarantees of penalized l1-minimization (also called Basis Pursuit Denoising or Lasso) in terms of sparsity pattern recovery (support and sign consistency) from noisy measurements with not necessarily random noise, when the sensing operator belongs to the Gaussian ensemble (i.e., a random design matrix with i.i.d. Gaussian entries). More precisely, we ...



Journal:
  • IEEE Trans. Information Theory

Volume 55, Issue -

Pages -

Publication date 2009